Web Survey Bibliography
Title Nonresponses as context-sensitive response behaviour of participants in online-surveys and their relevance for data quality
Author Wetzlehuetter, D.
Year 2017
Access date 15.09.2017
Abstract Starting point and focus: The internet cannot be ignored as a quick, practicable and economical source of information and a nearly unlimited communication channel: a mass medium (online news), a mainstream medium (social media) and an individual medium (email). The number of web surveys, and of methods for conducting them, has grown with the use of the internet. For instance, the Arbeitskreis Deutscher Markt- und Sozialforschungsinstitute e.V. recorded a continuous increase in the share of quantitative web surveys among its members, from 1% in 1998 to 16% in 2004, 38% in 2010 and 43% in 2014. However, as extensive discussions show, web-based surveys are not free of controversy. Questionable data quality is an increasingly common concern, typically regarding the representativeness of the data (coverage error / missing data) and the difficulty of obtaining unbiased responses (measurement error) caused by the device used (mode effects). Errors caused by continuously rising proportions of drop-outs and item nonresponses in online surveys are almost equally relevant, yet these sources of error are repeatedly neglected to some degree.
As its starting point, the paper assumes that drop-out rates and item-nonresponse rates in online surveys reflect context-sensitive response behaviour (whether or not respondents are at home, and whether or not they use a smartphone). This means that systematic errors linked to the interview situation (in terms of location and device) are conceivable. Accordingly, the presentation aims to illustrate how, and to what extent, the context of the interview situation has to be considered in the cleansing and analysis of data collected online in order to avoid biased results as far as possible.
Methods and Data: To test this assumption, an online survey on the “participation of university students” was used. An experimental design was applied, both to provoke drop-outs and to test the consequences of different motivation strategies (the prospect of winning a prize, appeals, manipulation of the progress bar) that are easy to insert and therefore often used in online surveys. For this purpose, an unusually long questionnaire (23 online pages, 121 items) was developed, into which the different motivation strategies were embedded. Of the n=17,491 students invited to take part in the survey, 14.2% reacted to the invitation, 1,916 (11%) answered at least one question, and just 7.3% (n=1,282) reached the final page.
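A minimal Python sketch (counts taken from the abstract; variable names are ours) confirms that the reported rates are consistent when computed relative to all invited students:

    # Reported participation figures, relative to the invited sample.
    invited = 17491                   # students invited to the survey
    answered_any = 1916               # answered at least one question
    completed = 1282                  # reached the final page
    reacted = round(invited * 0.142)  # 14.2% reacted to the invitation

    print(f"reacted:   {reacted / invited:.1%}")       # -> 14.2%
    print(f"answered:  {answered_any / invited:.1%}")  # -> 11.0%
    print(f"completed: {completed / invited:.1%}")     # -> 7.3%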
Results: Drop-out rates and item-nonresponse rates differ depending on the survey context specified above: not being at home and using a smartphone both increase them. The motivation strategies used work differently: they reduce the risk of nonresponse only for respondents who were at home and not using a smartphone. However, data cleansing does not affect the sample composition with regard to study-related characteristics. Detailed analyses show that the influence of the defined survey context on substantive findings varies. On this basis, the presentation will emphasize the importance of recording and considering the context information of data collection for data cleansing, analysis and interpretation of results, and will discuss how this ...
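The context-sensitive comparison described here amounts to cross-tabulating drop-out rates by location and device. An illustrative pandas sketch, with invented column names and data rather than the authors' material:

    import pandas as pd

    # Hypothetical respondent-level data: interview context plus a
    # drop-out flag (1 = did not reach the final page).
    df = pd.DataFrame({
        "at_home":     [True, True, False, False, True, False],
        "smartphone":  [False, True, False, True, False, True],
        "dropped_out": [0, 1, 1, 1, 0, 1],
    })

    # Mean of the 0/1 flag per (location, device) cell = drop-out rate.
    dropout_by_context = (
        df.groupby(["at_home", "smartphone"])["dropped_out"]
          .mean()
          .rename("dropout_rate")
    )
    print(dropout_by_context)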
Access/Direct link Conference Homepage (abstract) / (presentation)
Year of publication 2017
Bibliographic type Conferences, workshops, tutorials, presentations
Web survey bibliography (364)
- Displaying Videos in Web Surveys: Implications for Complete Viewing and Survey Responses; 2017; Mendelson, J.; Lee Gibson, J.; Romano Bergstrom, J. C.
- Usability Testing for Survey Research; 2017; Geisen, E.; Romano Bergstrom, J. C.
- Where, When, How and with What Do Panel Interviews Take Place and Is the Quality of Answers Affected...; 2017; Niebruegge, S.
- Nonresponses as context-sensitive response behaviour of participants in online-surveys and their relevance...; 2017; Wetzlehuetter, D.
- Do distractions during web survey completion affect data quality? Findings from a laboratory experiment...; 2017; Wenz, A.
- Predicting Breakoffs in Web Surveys; 2017; Mittereder, F.; West, B. T.
- Comparing acquiescent and extreme response styles in face-to-face and web surveys; 2017; Liu, M.; Conrad, F. G.; Lee, S.
- Respondent mode choice in a smartphone survey; 2017; Conrad, F. G.; Schober, M. F.; Antoun, C.; Yan, H. Y.; Hupp, A.; Johnston, M.; Ehlen, P.; Vickers, L...
- Effects of Mobile versus PC Web on Survey Response Quality: a Crossover Experiment in a Probability...; 2017; Antoun, C.; Couper, M. P.; Conrad, F. G.
- Methods for Evaluating Respondent Attrition in Web-Based Surveys; 2016; Hochheimer, C. J.; Sabo, R. T.; Krist, A. H.; Day, T.; Cyrus, J.; Woolf, S. H.
- Mobile-only web survey respondents; 2016; Lugtig, P. J.; Toepoel, V.; Amin, A.
- Using official surveys to reduce bias of estimates from nonrandom samples collected by web surveys; 2016; Beresovsky, V.; Dorfman, A.; Rumcheva, P.
- Making use of Internet interactivity to propose a dynamic presentation of web questionnaires; 2016; Revilla, M.; Ochoa, C.; Turbina, A.
- Helping respondents provide good answers in Web surveys; 2016; Couper, M. P.; Zhang, C.
- Gamifying. Not all fun and games; 2016; Stubington, P.; Crichton, C.
- FocusVision 2015 Annual MR Technology Report; 2016; Macer, T.; Wilson, S.
- Are sliders too slick for surveys?; 2016; Buskirk, T. D.
- Research gamification for quality pharmaceutical stakeholder insights; 2016; Mondry, B.; Fink, L.
- SurveyTester from Knowledge Navigators; 2016; Macer, T.
- Simplifying your mobile solution; 2016; Berry, K.
- Effects of motivating question types with graphical support in multi channel design studies; 2016; Luetters, H.; Friedrich-Freksa, M.; Vitt, S.; Goldstein, D. G.
- Why Do Web Surveys Take Longer on Smartphones?; 2016; Couper, M. P.; Peterson, G. J.
- Usability Testing within Agile Process; 2016; Holland, T.
- Association of Eye Tracking with Other Usability Metrics ; 2016; Olmsted, E. L.
- Cognitive Probing Methods in Usability Testing – Pros and Cons; 2016; Nichols, E. M.
- Thinking Inside the Box: Visual Design of the Response Box Affects Creative Divergent Thinking in an...; 2016; Mohr, A. H.; Sell, A.; Lindsay, T.
- Distractions: The Incidence and Consequences of Interruptions for Survey Respondents ; 2016; Ansolabehere, S.; Schaffner, B. F.
- The Effect of CATI Questions, Respondents, and Interviewers on Response Time; 2016; Olson, K.; Smyth, J. D.
- New Generation of Online Questionnaires?; 2016; Revilla, M.; Ochoa, C.; Turbina, A.
- The Analysis of Respondent’s Behavior toward Edit Messages in a Web Survey; 2016; Park, Y.
- Effects of Data Collection Mode and Response Entry Device on Survey Response Quality; 2016; Ha, L.; Zhang, Che.; Jiang, W.
- Navigation Buttons in Web-Based Surveys: Respondents’ Preferences Revisited in the Laboratory; 2016; Romano Bergstrom, J. C.; Erdman, C.; Lakhe, S.
- Online Surveys are Mixed-Device Surveys. Issues Associated with the Use of Different (Mobile) Devices...; 2016; Toepoel, V.; Lugtig, P. J.
- A Technical Guide to Effective and Accessible web Surveys; 2016; Baatard, G.
- The Validity of Surveys: Online and Offline; 2016; Wiersma, W.
- Computer-assisted and online data collection in general population surveys; 2016; Skarupova, K.
- A Framework of Incorporating Thai Social Networking Data in Online Marketing Survey; 2016; Jiamthapthaksin, R.; Aung, T. H.; Ratanasawadwat, N.
- Creation and Usability Testing of a Web-Based Pre-Scanning Radiology Patient Safety and History Questionnaire...; 2016; Robinson, T. J.; DuVall, S.; Wiggins III, R.
- Comprehension and engagement in survey interviews with virtual agents; 2016; Conrad, F. G.; Schober, M. F.; Jans, M.; Orlowski, R. A.; Nielsen, D.; Levenstein, R. M.
- Taming Big Data: Using App Technology to Study Organizational Behavior on Social Media; 2015; Bail, C. A.
- A Meta-Analysis of Breakoff Rates in Mobile Web Surveys; 2015; Mavletova, A. M.; Couper, M. P.
- Optimizing the Decennial Census for Mobile – A Case Study; 2015; Nichols, E. M.; Hawala, E. O.; Horwitz, R.; Bentley, M.
- Using Video to Reinvigorate the Open Question; 2015; Cape, P.
- Are Sliders Too Slick for Surveys? An Experiment Comparing Slider and Radio Button Scales for Smartphone...; 2015; Aadland, D.; Aalberg, T.
- Web Surveys Optimized for Smartphones: Are there Differences Between Computer and Smartphone Users?; 2015; Andreadis, I.
- Designing web surveys for the multi-device internet; 2015; de Bruijne, M.
- Data Quality Standards in Mixed Mode Surveys; 2015; Bremer, J.; Barbulescu, M.; Bennett, J.
- Changing from CAPI to CAWI in an ongoing household panel - experiences from the German Socio-Economic...; 2015; Schupp, J.; Sassenroth, D.
- Rating Scales in Web Surveys: A Test of New Drag-and-Drop Rating Procedures; 2015; Kunz, T.
- A Review of Issues in Gamified Surveys; 2015; Keusch, F.; Zhang, Che.